Interface and Application Programming

This week's assignment was to create an interface that communicates with an input or output device. For the group assignment, we compared different interface tools:

  • FlutterFlow, which was used by Victor
  • AppInventor, which was used in other years here in Puebla
  • QTDesigner, which was used by Emilio
  • Processing, which has been used in years past in Puebla

For my final project, I will need an interface that can control three main things:

Camera Movement

To adjust the camera position in my final project, I will use a NEMA 17 stepper motor to drive a lead screw and nut system that holds the camera. The user must be able to move the camera either left or right, and each command will run the motor for a predefined number of seconds.

LED Light

The interface will be able to turn an LED strip on and off. The strip provides contrast on the acrylic plate so that the pressure the patient applies is clearly visible, ensuring an adequate image for later use in image-processing software. The LED strip will simply toggle between the two states (on/off).

Camera Input

A webcam mounted on top of the camera mechanism will transmit a live video feed to the user over USB. The user needs to be able to freeze the image for closer analysis and to flip it in case a mirrored view is required. The current image will be saved with the press of a button.

Creating the Interface

To create the interface, I used Tkinter ("Tk interface"), the standard Python binding for the Tcl/Tk GUI toolkit. This tool allows me to generate a GUI for the user to interact with the device. The following considerations were taken into account:

  • Initialization of Serial Communication: Serial communication with the Seeed XIAO RP2040 is opened on a COM# port (which depends on the configuration of each computer; a quick way to find it is shown in the sketch after this list).
  • Stepper Motor Control: The code needs a function to move the motor left or right (move_motor). Two buttons in the interface (left_button and right_button) trigger it, with arrows that visually tell the user which way the motor will move.
  • LED Control: A function to toggle the LED state (toggle_led) from ON to OFF and the other way around. A button in the interface toggles the LED state (led_button).
  • Video Capture and Image Freeze:
    • Functionality to capture video from the camera, ideally using OpenCV.
    • A button to freeze and unfreeze the camera image (freeze_button).
    • A function to save the frozen image (save_image).
    • A button to save the captured image (save_button).
  • Image Mirroring: A function (flip_camera) mirrors the camera image when needed. A button toggles the mirroring (flip_button).
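
Since the COM# port differs from computer to computer, it can be listed with pyserial before launching the interface. This is only a small sketch, separate from the interface code itself:

# Sketch: list the available serial ports with pyserial
import serial.tools.list_ports

for port in serial.tools.list_ports.comports():
    # port.device is the name to pass to init_serial, for example "COM3" on Windows
    print(port.device, "-", port.description)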

Code Implementation

All buttons have a dedicated function attached to them. The interface sends a letter associated with each function. For example, if the motor needs to move to the left, an “L” character will be sent, and the program embedded on the XIAO will respond accordingly.

Python Code


import tkinter as tk
from tkinter import messagebox
import serial
import cv2
from PIL import Image, ImageTk

# Create a function to start the communication
def init_serial(port):
    try:
        ser = serial.Serial(port, 9600)
        return ser
    except serial.SerialException:
        messagebox.showerror("Error", "Failed to connect to the specified port.")
        return None

# A function to send to the XIAO the Stepper Motor Control Functions
def move_motor(direction):
    if ser:
        if direction == 'left':
            ser.write(b'L')
        elif direction == 'right':
            ser.write(b'R')

# A function to send to the XIAO the LED Control Function
def toggle_led():
    if ser:
        ser.write(b'T')
        global led_state
        led_state = not led_state
        led_button.config(text="LED OFF" if led_state else "LED ON")

# Video Capture and Image Freeze Functions
def capture_video():
    global cap, frame, freeze
    if not freeze:
        ret, frame = cap.read()
        if ret:
            cv2image = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)
            img = Image.fromarray(cv2image)
            imgtk = ImageTk.PhotoImage(image=img)
            live_feed.imgtk = imgtk
            live_feed.configure(image=imgtk)
        live_feed.after(10, capture_video)

def freeze_image():
    global freeze
    freeze = not freeze
    freeze_button.config(text="Unfreeze Image" if freeze else "Freeze Image")
    if not freeze:
        capture_video()

def save_image():
    if frame is not None:
        cv2.imwrite("captured_image.png", frame)
        messagebox.showinfo("Info", "Image saved successfully.")

# Image Mirroring Function
# NOTE: in this first iteration the flip is only applied to the frame already held in
# memory, not to every new frame of the live feed
def flip_camera():
    global frame, flipped
    flipped = not flipped
    if flipped and frame is not None:
        frame = cv2.flip(frame, 1)

# Initialize Tkinter
root = tk.Tk()
root.title("Control Interface")

ser = init_serial("COM3")  # Replace COM3 with the appropriate port
led_state = False
freeze = False
flipped = False

# Video Capture
cap = cv2.VideoCapture(0)
frame = None

# Buttons and Live Feed
led_button = tk.Button(root, text="LED ON", command=toggle_led)
led_button.grid(row=0, column=0, columnspan=2)

left_button = tk.Button(root, text="LEFT", command=lambda: move_motor('left'))
left_button.grid(row=1, column=0)

right_button = tk.Button(root, text="RIGHT", command=lambda: move_motor('right'))
right_button.grid(row=1, column=1)

freeze_button = tk.Button(root, text="Freeze Image", command=freeze_image)
freeze_button.grid(row=2, column=0, columnspan=2)

save_button = tk.Button(root, text="Save Image", command=save_image)
save_button.grid(row=3, column=0, columnspan=2)

flip_button = tk.Button(root, text="Flip Image", command=flip_camera)
flip_button.grid(row=4, column=0, columnspan=2)

live_feed = tk.Label(root)
live_feed.grid(row=0, column=2, rowspan=5)

# Start video capture
capture_video()

root.mainloop()

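To run the script above, the pyserial, opencv-python, and Pillow packages need to be installed (Tkinter ships with most Python installations), for example:

pip install pyserial opencv-python pillow
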
This is the outcome of the first iteration of the interface. It is quite simple, but it is a good example of how to create an interface and how it works:

First Iteration of Interface

The interface naturally had some bugs on the first try, since it did not manage the image mirroring properly (a possible fix is sketched at the end of this page). Still, it was quite amazing to see it working. The interface needs to be complemented with the C code running on the Seeed XIAO RP2040, created in the Arduino IDE. The code I wrote is the following and implements the communication rules that the interface sends ('R', 'L', 'T'):

Arduino Code


#include <Arduino.h>

#define LED1_PIN 26 // This will control the MOSFET for the LED strip
#define STEP_PIN 1
#define DIR_PIN 2

void setup() {
    Serial.begin(9600);
    pinMode(LED1_PIN, OUTPUT);
    pinMode(STEP_PIN, OUTPUT);
    pinMode(DIR_PIN, OUTPUT);
    digitalWrite(LED1_PIN, LOW); // Initially turn off the MOSFET for the LED strip
}

void loop() {
    if (Serial.available() > 0) {
        char command = Serial.read();
        switch (command) {
            case 'T':
                Serial.println("Command received: Toggle LED 1");
                toggleLED(LED1_PIN);
                break;
            case 'L':
                Serial.println("Command received: Move motor left");
                digitalWrite(DIR_PIN, LOW);
                stepMotor(2000);  // Move motor for 2000 milliseconds (2 seconds)
                break;
            case 'R':
                Serial.println("Command received: Move motor right");
                digitalWrite(DIR_PIN, HIGH);
                stepMotor(2000);  // Move motor for 2000 milliseconds (2 seconds)
                break;
            default:
                Serial.println("Unknown command received");
                break;
        }
    }
}

void stepMotor(int duration) {
    unsigned long startTime = millis();
    while (millis() - startTime < duration) {
        digitalWrite(STEP_PIN, HIGH);
        delayMicroseconds(500);  // Adjust delay for speed control
        digitalWrite(STEP_PIN, LOW);
        delayMicroseconds(500);  // Adjust delay for speed control
    }
}

void toggleLED(int pin) {
    digitalWrite(pin, !digitalRead(pin));
}

With that code, the interface is able to connect to the XIAO and execute the corresponding function. It is important to note that the Arduino IDE Serial Monitor and the Tkinter interface cannot be open at the same time: the serial port can only be held by one program at a time, so if one of them has the port open, the other will fail to connect and the interface code will not work.
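
Related to this, it is good practice to release the serial port and the webcam when the interface window is closed. A minimal sketch of how this could be added to the Python code above (on_close is a name I chose for illustration, not part of the original code):

# Sketch: release the serial port and the webcam when the window is closed
def on_close():
    if ser:
        ser.close()   # free the COM port so the Arduino IDE can use it again
    cap.release()     # release the webcam
    root.destroy()

root.protocol("WM_DELETE_WINDOW", on_close)  # run on_close when the window is closed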

After several iterations, this is the final layout that I got:

Final Layout of Interface

As you can see, I added the Fab Lab Puebla logo and customized the left and right buttons with images so the interface is a bit more intuitive and appealing for the user.
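
Putting images on Tkinter buttons is done with tk.PhotoImage, which reads PNG files directly in Tk 8.6 and newer. A small sketch of how the arrow buttons could be set up (the file names are placeholders, not the actual files I used):

# Sketch: buttons with arrow images instead of text labels
left_img = tk.PhotoImage(file="left_arrow.png")
right_img = tk.PhotoImage(file="right_arrow.png")

left_button = tk.Button(root, image=left_img, command=lambda: move_motor('left'))
right_button = tk.Button(root, image=right_img, command=lambda: move_motor('right'))

# Keep a reference to the images so they are not garbage-collected
left_button.image = left_img
right_button.image = right_img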

Here is a video of the interface interacting with the stepper motor and the live video feed:

Note: The motor may seem to move in the wrong direction, but this is because the image was flipped (mirrored) beforehand.
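
Finally, since the mirroring handling was the weak point of the first iteration, here is a sketch of how the flip could be applied to every new frame inside capture_video, instead of only to the frame already held in memory (it reuses the same names as the Python code above):

# Sketch: apply the mirroring inside the capture loop
def capture_video():
    global cap, frame, freeze
    if not freeze:
        ret, new_frame = cap.read()
        if ret:
            if flipped:
                new_frame = cv2.flip(new_frame, 1)  # mirror every frame, not just the frozen one
            frame = new_frame
            cv2image = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)
            img = Image.fromarray(cv2image)
            imgtk = ImageTk.PhotoImage(image=img)
            live_feed.imgtk = imgtk
            live_feed.configure(image=imgtk)
        live_feed.after(10, capture_video)

def flip_camera():
    global flipped
    flipped = not flipped  # the following frames drawn by capture_video will be mirrored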